OutFlank Routing: Increasing Throughput in Toroidal Interconnection Networks
We present a new deadlock-free routing scheme for toroidal interconnection
networks, called OutFlank Routing (OFR). OFR is an adaptive strategy that
exploits non-minimal links at both the source and the destination nodes.
When minimal links are congested, OFR deroutes packets to carefully chosen
intermediate destinations, in order to obtain travel paths that are only an
additive constant longer than the shortest ones. Since routing performance is
very sensitive to changes in the traffic model or in the router parameters, an
accurate discrete-event simulator of the toroidal network has been developed to
empirically validate OFR, by comparing it against other relevant routing
strategies, over a range of typical real-world traffic patterns. On the
16x16x16 (4096 nodes) simulated network, OFR exhibits improvements in maximum
sustained throughput of between 14% and 114% with respect to Adaptive Bubble
Routing.
Comment: 9 pages, 5 figures, to be presented at ICPADS 201
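The distance structure underlying OFR's derouting argument can be sketched as follows. This is an illustrative computation only, not the authors' router: minimal hop distance on a torus uses per-dimension wraparound, and routing through a well-chosen intermediate node adds at most a bounded number of hops. The node coordinates and the intermediate node below are hypothetical.

```python
# Illustrative sketch (not the authors' code): minimal hop distance on a
# 16x16x16 torus, and the cost of derouting via an intermediate destination.

def torus_dist(a, b, dims):
    """Minimal hop count between nodes a and b, using wraparound links."""
    return sum(min((x - y) % d, (y - x) % d) for x, y, d in zip(a, b, dims))

dims = (16, 16, 16)               # the 4096-node network simulated in the paper
src, dst = (0, 0, 0), (5, 12, 3)  # hypothetical source and destination
direct = torus_dist(src, dst, dims)

# Derouting to a carefully chosen intermediate destination should lengthen
# the path by only an additive constant:
mid = (1, 13, 3)                  # hypothetical intermediate node
detour = torus_dist(src, mid, dims) + torus_dist(mid, dst, dims)
print(direct, detour)             # detour is never shorter than direct
```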
Data Structures for Task-based Priority Scheduling
Many task-parallel applications can benefit from attempting to execute tasks
in a specific order, as for instance indicated by priorities associated with
the tasks. We present three lock-free data structures for priority scheduling
with different trade-offs on scalability and ordering guarantees. First we
propose a basic extension to work-stealing that provides good scalability, but
cannot provide ordering guarantees between threads. Next, we
present a centralized priority data structure based on k-fifo queues, which
provides strong (but still relaxed with respect to a sequential specification)
guarantees. The parameter k allows the trade-off between scalability and the
required ordering guarantee to be configured dynamically. Third, and finally, we
combine both data structures into a hybrid k-priority data structure, which
provides scalability similar to the work-stealing based approach for larger
k, while giving strong ordering guarantees for smaller k. We argue for
using the hybrid data structure as the best compromise for generic,
priority-based task-scheduling.
We analyze the behavior and trade-offs of our data structures in the context
of a simple parallelization of Dijkstra's single-source shortest path
algorithm. Our theoretical analysis and simulations show that both the
centralized and the hybrid k-priority data structures can give strong
guarantees on the useful work performed by the parallel Dijkstra algorithm. We
support our results with experimental evidence on an 80-core Intel Xeon system.
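A toy sequential model can make the relaxed-ordering idea concrete. This sketch is an illustration under assumptions, not the paper's lock-free implementation: a k-relaxed delete_min may return any of the k smallest items, which is the kind of weakened guarantee the abstract trades for scalability.

```python
import heapq
import random

class RelaxedPriorityQueue:
    """Sequential toy model of a k-relaxed priority queue: delete_min may
    return any of the k smallest items instead of strictly the minimum."""

    def __init__(self, k):
        self.k = k
        self.heap = []

    def insert(self, priority, task):
        heapq.heappush(self.heap, (priority, task))

    def delete_min(self):
        # Pop up to k smallest items, return one at random, restore the rest.
        n = min(self.k, len(self.heap))
        candidates = [heapq.heappop(self.heap) for _ in range(n)]
        chosen = candidates.pop(random.randrange(n))
        for item in candidates:
            heapq.heappush(self.heap, item)
        return chosen

q = RelaxedPriorityQueue(k=3)
for p in [5, 1, 4, 2, 3]:
    q.insert(p, f"task{p}")
prio, task = q.delete_min()       # one of the 3 smallest priorities: 1, 2 or 3
```

A smaller k tightens the ordering guarantee (k=1 is an exact priority queue), while a larger k leaves more freedom for concurrent threads to proceed without contending on the same minimum.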
Estimating Peak-Hour Traffic Congestion Patterns For Interacting Agents On Urban Networks
We study the emergence of congestion patterns in urban networks by modeling
vehicular interaction by means of a simple traffic rule and by using a set of
measures inspired by the standard Betweenness Centrality (BC). We consider a
topologically heterogeneous group of cities and simulate the network loading
during the morning peak-hour by increasing the number of circulating vehicles.
At departure, vehicles are aware of the network state and choose paths with
optimal traversal time. Each added path modifies the vehicular density and
travel times for the following vehicles. Starting from an empty network and
adding traffic until transportation collapses provides a framework to study
the network's transition to congestion and how connectivity is progressively
disrupted as the fraction of impossible paths abruptly becomes dominant. We use
standard BC to probe into the instantaneous out-of-equilibrium network state
for a range of traffic levels and show how this measure may be improved to
build a better proxy for cumulative road usage during peak-hours. We define a
novel dynamical measure to estimate cumulative road usage and the associated
total time spent over the edges by the population of drivers. We also study how
congestion starts with dysfunctional edges scattered over the network, then
organizes itself into relatively small but disruptive clusters.
Comment: 8 pages, accepted at Complex Networks 2022 Palerm
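The network-loading procedure described above can be sketched with a deliberately simple congestion rule. The graph, the slowdown factor, and the rule itself are illustrative assumptions, not the paper's traffic model:

```python
import heapq

def dijkstra(graph, times, src, dst):
    """Fastest path under the current travel times, returned as a list of edges."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v in graph[u]:
            nd = d + times[(u, v)]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:
        path.append((prev[node], node))
        node = prev[node]
    return path[::-1]

# Two routes from A to D; each added vehicle takes the currently fastest one.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
times = {("A", "B"): 1.0, ("A", "C"): 1.5, ("B", "D"): 1.0, ("C", "D"): 1.0}
usage = {e: 0 for e in times}

for _ in range(10):               # load vehicles one at a time
    for e in dijkstra(graph, times, "A", "D"):
        usage[e] += 1             # cumulative road usage over the peak hour
        times[e] *= 1.5           # every traversal slows the edge for later vehicles
print(usage)
```

Even in this toy setting the congestion feedback spreads load across both routes; cumulative edge usage, rather than an instantaneous snapshot, is the quantity the abstract's dynamical measure targets.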
Clinical conundrum. Three management strategies for three-vessel coronary artery disease?
Inspired by King's words, the goal of optimal pharmacotherapy is to improve patient outcomes in an appropriate and consistent fashion, integrating with other management strategies when and as appropriate; this goal cannot be achieved, however, unless such therapy is implemented in a forceful and proactive fashion. Indeed, cardiovascular pharmacotherapy for ischemic heart disease due to coronary artery disease (CAD) represents a unique case study in this sense, given the complex interplay between societal and individual preventive strategies, as well as clinical treatments aimed at secondary or tertiary prevention, which may appear to challenge immediate and thorough implementation
Adaptive Image Contrast Enhancement by Computing Distances into a 4-Dimensional Fuzzy Unit Hypercube
A new fuzzy procedure for adaptive gray-level image contrast enhancement (CE) is presented in this paper. Starting from the pixels of a normalized gray-level image, an appropriate smooth S-shaped fuzzy membership function (MF) is considered for gray-scale transformation and is adaptively developed through noise reduction and information-loss minimization. Then, a set of fuzzy patches is extracted from the MF and, for the support of each patch, we compute four ascending-order statistics that become points inside a 4-D fuzzy unit hypercube after a suitable fuzzification step. CE is performed by computing the distances between these points and the points of maximum darkness and maximum brightness (special vertices of the hypercube), and by determining the rotation of the tangent line to the MF around a crucial point where fuzzy patches and the MF coexist. The proposed procedure achieves high CE in all the treated images, with performance fully comparable to that obtained by three more sophisticated fuzzy techniques and by standard histogram equalization
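As a rough sketch of the mechanism (with a fixed, hand-chosen membership function, whereas the paper develops the MF adaptively), the classic smooth S-function can serve as the gray-scale transform, and distances to the darkness and brightness vertices of the unit hypercube can be computed directly. All parameter and pixel values below are hypothetical:

```python
import math

def s_function(x, a, b, c):
    """Classic smooth S-shaped membership function on [0, 1]:
    feet at a, crossover at b = (a + c) / 2, shoulder at c."""
    if x <= a:
        return 0.0
    if x <= b:
        return 2 * ((x - a) / (c - a)) ** 2
    if x <= c:
        return 1 - 2 * ((x - c) / (c - a)) ** 2
    return 1.0

# Applying the MF to normalized gray levels darkens the dark mid-tones and
# brightens the bright ones, i.e. stretches contrast:
pixels = [0.1, 0.3, 0.5, 0.7, 0.9]
enhanced = [s_function(p, a=0.1, b=0.5, c=0.9) for p in pixels]

# Distances from a fuzzified 4-D point (four ascending-order statistics of a
# hypothetical dark patch) to the darkness and brightness vertices:
point = (0.1, 0.2, 0.3, 0.4)
d_dark = math.dist(point, (0.0,) * 4)
d_bright = math.dist(point, (1.0,) * 4)
```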
Joint use of eddy current imaging and fuzzy similarities to assess the integrity of steel plates
Abstract
Bi-axially loaded steel plates are characterized by mechanical deformations whose 2D image representations are very difficult to obtain. In this work, the authors propose an innovative approach based on eddy current techniques for obtaining 2D electrical maps with which to assess the mechanical integrity of a steel plate. The procedure, which also exploits fuzzy similarity computations, translates the assessment of the mechanical integrity of a steel plate into a suitable classification problem. The results obtained by the proposed procedure show performance comparable to that provided by well-established soft computing approaches of higher computational complexity
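The abstract does not specify which fuzzy similarity is used; a common choice is the min-max (Jaccard-like) similarity between membership vectors, sketched here with hypothetical map signatures and a hypothetical calibration threshold:

```python
def fuzzy_similarity(a, b):
    """Min-max similarity of two fuzzy membership vectors with values in [0, 1]."""
    num = sum(min(x, y) for x, y in zip(a, b))
    den = sum(max(x, y) for x, y in zip(a, b))
    return num / den if den else 1.0

reference = [0.9, 0.8, 0.2, 0.1]   # signature of an intact plate (hypothetical)
measured = [0.85, 0.75, 0.3, 0.2]  # signature of the plate under test (hypothetical)
score = fuzzy_similarity(reference, measured)

# Turn the similarity into a binary integrity classification:
is_intact = score > 0.8            # threshold would come from calibration data
```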
The 2009 Messina flood: NGAL in two patients with Crush Syndrome
Neutrophil Gelatinase-Associated Lipocalin (NGAL) is one of the most promising biomarkers used in the diagnosis of Acute Kidney Injury (AKI), since its increase is a good short-term predictor of the development of acute renal failure, well in advance of the rise in serum creatinine values. We report our experience with a case of Crush Syndrome in two patients who were victims of the flood that struck Messina. The development of AKI following Crush Syndrome is the second most common cause of death after earthquakes and other natural disasters, but it is at the same time a disaster-related complication that can be reversible, particularly when diagnosed and treated early. In this case, NGAL allowed us to make an early diagnosis of AKI, anticipating the alterations of classical markers such as creatinine; we also observed a direct correlation between NGAL values, the evolution of renal damage, and the prognosis of the two patients
Relationship between coronary plaque morphology of the left anterior descending artery and 12 months clinical outcome: the CLIMA study
Abstract
Aims
The CLIMA study, on the relationship between coronary plaque morphology of the left anterior descending artery and twelve months clinical outcome, was designed to explore the predictive value of multiple high-risk plaque features in the same coronary lesion [minimum lumen area (MLA), fibrous cap thickness (FCT), lipid arc circumferential extension, and presence of optical coherence tomography (OCT)-defined macrophages] as detected by OCT. Composite of cardiac death and target segment myocardial infarction was the primary clinical endpoint.
Methods and results
From January 2013 to December 2016, 1003 patients undergoing OCT evaluation of the untreated proximal left anterior descending coronary artery in the context of a clinically indicated coronary angiogram were prospectively enrolled at 11 independent centres (clinicaltrials.gov identifier NCT02883088). At 1 year, the primary clinical endpoint was observed in 37 patients (3.7%). In a total of 1776 lipid plaques, presence of MLA &lt;3.5 mm², FCT &lt;75 µm, lipid arc circumferential extension &gt;180° (HR 2.4, 95% CI 1.2–4.8), and OCT-defined macrophages (HR 2.7, 95% CI 1.2–6.1) were all associated with increased risk of the primary endpoint. The pre-specified combination of plaque features (simultaneous presence of the four OCT criteria in the same plaque) was observed in 18.9% of patients experiencing the primary endpoint and was an independent predictor of events (HR 7.54, 95% CI 3.1–18.6).
Conclusion
The simultaneous presence of all four high-risk OCT plaque features was associated with a higher risk of major coronary events.